# Developing Playback Applications

**Important:** Due to NaCl deprecation by the Chromium project, Tizen TV will continue its support for NaCl only until 2021-year products. Meanwhile, Tizen TV will start focusing on high-performance, cross-browser WebAssembly from 2020-year products.

This topic describes in detail how to develop a media player application using the NaCl Player API. The topic uses the Native Player sample application as an example.

Related Info: NaCl Player API | Samples: Native Player | Native Player on GitHub

The Native Player sample application implements some of the main NaCl (Native Client) Player API use cases. The sample demonstrates the media data flow: data download, the demuxing process, and displaying the content on the TV.

The sample application is designed in a modular and extensible way. Its source code is released under the permissive MIT license, allowing you to reuse and customize the code to meet your own needs. You can access the latest Native Player source code from the Native Player GitHub repository.

**Important:** This application uses an API that is now deprecated. Use raw pointers instead of `shared_ptr`s.

## Native Player Architecture

The NaCl Player API is used by the NaCl module. All player logic is implemented in the NaCl module, including user authorization, downloading and demuxing (parsing) media content, decrypting DRM-protected content, and passing demuxed elementary stream packets to the NaCl Player.

The following figure shows the main structural layers, the components within the layers, and the data flow of a typical NaCl Player application.

External data for the NaCl Player application is stored in:

- The media server, which stores the media content
- The DRM server, which provides information needed to decrypt DRM-protected content

The application is installed and run by the user. It consists of two components:

- The HTML5 component, which implements the UI and handles user interaction. The component has been omitted from the figure for clarity.
- The Native Client (NaCl) module component.
The NaCl module component is responsible for the NaCl Player application logic:

- The Communicator handles communication with the HTML5 component, authorizing the user's access to the requested content.
- The Controller interfaces with the NaCl Player by, for example, requesting data download and appending media packets to the NaCl Player.
- The DataProvider downloads external data in chunks from the media server.
- The DataDemuxer processes media content and passes the packets to the Controller. Downloaded content is demuxed into elementary stream packets:
  - Clear content is not encrypted and can be parsed directly into elementary stream packets.
  - DRM-protected content is decrypted: custom DRM by the NaCl module, and platform-supported DRM by the platform.

The platform contains the implementation of the player, which is responsible for performing the functions called by the application, such as decoding packets, playing audio, and rendering video on the screen.

## Playing from URL Data Sources

The sample application supports MPEG4 media playback from a URL data source, with external or internal subtitles. You can show or hide external subtitles, and you can show, hide, and select internal subtitle tracks.

In this scenario, the platform is responsible for downloading and demuxing content. The application is responsible only for playback control, using the Controller and Communicator components.

In the "inc/player/url_player/url_player_controller.h" file, manage playback from the URL data source using the `UrlPlayerController` class. It aggregates all the necessary objects:

```cpp
class UrlPlayerController : public PlayerController {
  // ...
 private:
  // ...
  PlayerListeners listeners_;

  std::shared_ptr<Samsung::NaClPlayer::MediaDataSource> data_source_;
  std::shared_ptr<Samsung::NaClPlayer::MediaPlayer> player_;
  std::unique_ptr<Samsung::NaClPlayer::TextTrackInfo> text_track_;
  std::vector<Samsung::NaClPlayer::TextTrackInfo> text_track_list_;
  std::shared_ptr<Communication::MessageSender> message_sender_;
  // ...
};
```
"src/player/player/player_listeners cc" file implement the subtitlelistener class to handle subtitle events it forwards the received subtitle text and duration to the javascript application component, which is responsible for displaying the subtitle void subtitlelistener onshowsubtitle timeticks duration, const char* text { var vartext = var text ; log "got subtitle %s , duration %f", text, duration ; if auto message_sender = message_sender_ lock { message_sender->showsubtitles duration, vartext ; } } implement the mediaeventslistener class to handle player and playback events the onbufferingcomplete event notifies the javascript application component that the nacl player has buffered enough data to start playback void mediabufferinglistener onbufferingcomplete { logm "event buffering complete! now you may play " ; if auto message_sender = message_sender_ lock { message_sender->bufferingcompleted ; } // } in the "src/player/url_player/url_player_controller cc" file, initialize and register the listeners to a mediaplayer object void urlplayercontroller initplayer const std string& url, const std string& subtitle, const std string& encoding { // for external subtitles, the subtitle argument points to the location of the subtitle file // for internal subtitles, the subtitle argument must be an empty string // player_ = make_shared<mediaplayer> ; // initialize listeners and register them to the player listeners_ player_listener = make_shared<mediaplayerlistener> message_sender_ ; listeners_ buffering_listener = make_shared<mediabufferinglistener> message_sender_ ; listeners_ subtitle_listener = make_shared<subtitlelistener> message_sender_ ; player_->setmediaeventslistener listeners_ player_listener ; player_->setbufferinglistener listeners_ buffering_listener ; player_->setsubtitlelistener listeners_ subtitle_listener ; register external subtitles to the player to check whether an external subtitle url has been defined, and pass the address to the player // register 
external subtitles source, if defined if !subtitle empty { text_track_ = makeunique<texttrackinfo> ; int32_t ret = player_->addexternalsubtitles subtitle, encoding, *text_track_ ; // } when all the player objects are set up, initialize and construct a urldatasource object to handle a given url address, and attach it to the player void urlplayercontroller initializeurlplayer const std string& content_container_url { // data_source_ = make_shared<urldatasource> content_container_url ; player_->attachdatasource *data_source_ ; // retrieve information about the available internal and external text tracks using the urlplayercontroller posttexttrackinfo function, and send the track list to the html5 application component for showing in the ui void urlplayercontroller posttexttrackinfo { int32_t ret = player_->gettexttrackslist text_track_list_ ; if ret == errorcodes success { log "gettexttrackinfo called successfully" ; message_sender_->settexttracks text_track_list_ ; } else { log_error "gettexttrackinfo call failed, code %d", ret ; } } when the player receives the onbufferingcomplete event, start playback void urlplayercontroller play { // int32_t ret = player_->play ; // } during playback, the application receives subtitlelistener onshowsubtitle events containing subtitle text and timing information to display the subtitles, pass the subtitle event data to the html5 application component playing from elementary stream data sources the sample application supports mpeg-dash media playback from elementary stream data sources it demonstrates the following use cases playing clear mpeg-dash content with external subtitles playing playready drm-encrypted mpeg-dash content in these scenarios, the application is responsible for downloading and demuxing content, and controlling the media player data provider to play mpeg-dash content, the dashmanifest class uses the libdash library to parse a dash manifest file and download the appropriate media file or media chunk the 
The `DashManifest` class is also responsible for extracting content protection (DRM) initialization information included in the DASH manifest, through the `ContentProtectionVisitor` class.

In the sample application, the "src/dash" directory contains the data provider implementation.

### Data Demuxer

In the sample application "src/demuxer" directory, the `FFMpegDemuxer` class implements the `StreamDemuxer` interface. The `StreamDemuxer` interface supports clear and encrypted elementary stream content. It demuxes the media content into elementary stream packets.

The `FFMpegDemuxer` class uses the FFmpeg library. It extracts the elementary stream configuration and protection data, such as the PSSH box and DRM initialization data, from the media content.

To implement the demuxer, call the `StreamDemuxer::Parse` function when data must be parsed:

```cpp
void FFMpegDemuxer::Parse(const std::vector<uint8_t>& data) {
  // ...
  parser_thread_.message_loop().PostWork(
      callback_factory_.NewCallback(&FFMpegDemuxer::StartParsing));
  // ...
}
```

To keep the application responsive, perform the parsing in the side thread:

```cpp
void FFMpegDemuxer::StartParsing(int32_t) {
  // ...
  if (!streams_initialized_)
    InitStreamInfo();

  AVPacket pkt;
  av_init_packet(&pkt);
  pkt.data = NULL;
  pkt.size = 0;

  while (!exited_) {
    // ...
    unique_ptr<ElementaryStreamPacket> es_pkt;
    int32_t ret = av_read_frame(format_context_, &pkt);
    if (ret < 0) {
      if (ret == AVERROR_EOF) {
        exited_ = true;
        packet_msg = kEndOfStream;
      }
      // ...
    } else {
      es_pkt = MakeESPacketFromAVPacket(&pkt);
    }
    if (es_data_callback_)
      es_data_callback_(packet_msg, std::move(es_pkt));
    av_free_packet(&pkt);
  }
  // ...
}
```

**Note:** The parsed packets are delivered to the application using the `es_data_callback_` function, which was registered during the `StreamDemuxer` class initialization.

Convert the FFmpeg elementary packet (`AVPacket`) into an elementary stream packet:

```cpp
unique_ptr<ElementaryStreamPacket> FFMpegDemuxer::MakeESPacketFromAVPacket(
    AVPacket* pkt) {
  auto es_packet = MakeUnique<ElementaryStreamPacket>(pkt->data, pkt->size);

  AVStream* s = format_context_->streams[pkt->stream_index];

  // Set ES packet information
  es_packet->SetPts(ToTimeTicks(pkt->pts, s->time_base) + timestamp_);
  es_packet->SetDts(ToTimeTicks(pkt->dts, s->time_base) + timestamp_);
  es_packet->SetDuration(ToTimeTicks(pkt->duration, s->time_base));
  es_packet->SetKeyFrame(pkt->flags == 1);

  AVEncInfo* enc_info = reinterpret_cast<AVEncInfo*>(
      av_packet_get_side_data(pkt, AV_PKT_DATA_ENCRYPT_INFO, NULL));
  if (!enc_info)
    return es_packet;

  // Packet is encrypted
  es_packet->SetKeyId(enc_info->kid, kKidLength);
  es_packet->SetIv(enc_info->iv, enc_info->iv_size);
  for (uint32_t i = 0; i < enc_info->subsample_count; ++i) {
    es_packet->AddSubsample(enc_info->subsamples[i].bytes_of_clear_data,
                            enc_info->subsamples[i].bytes_of_enc_data);
  }
  return es_packet;
}
```

### MPEG-DASH Content Playback

To implement MPEG-DASH content playback, in the "inc/player/es_dash_player/es_dash_player_controller.h" file, manage playback from the elementary stream data source using the `EsDashPlayerController` class. It aggregates all the necessary objects:

```cpp
class EsDashPlayerController : public PlayerController {
  // ...
 private:
  // ...
  PlayerListeners listeners_;

  std::shared_ptr<Samsung::NaClPlayer::MediaDataSource> data_source_;
  std::shared_ptr<Samsung::NaClPlayer::MediaPlayer> player_;
  std::unique_ptr<Samsung::NaClPlayer::TextTrackInfo> text_track_;
  std::shared_ptr<Communication::MessageSender> message_sender_;

  std::unique_ptr<DashManifest> dash_parser_;
  std::array<std::shared_ptr<StreamManager>,
             static_cast<int32_t>(StreamType::MaxStreamTypes)> streams_;
  std::vector<VideoStream> video_representations_;
  std::vector<AudioStream> audio_representations_;
  // ...
};
```

Create the player and listeners, and assign the listeners to the player object:

```cpp
void EsDashPlayerController::InitPlayer(const std::string& mpd_file_path,
                                        const std::string& subtitle,
                                        const std::string& encoding) {
  // ...
  player_ = make_shared<MediaPlayer>();
  listeners_.player_listener =
      make_shared<MediaPlayerListener>(message_sender_);
  listeners_.buffering_listener =
      make_shared<MediaBufferingListener>(message_sender_, shared_from_this());

  player_->SetMediaEventsListener(listeners_.player_listener);
  player_->SetBufferingListener(listeners_.buffering_listener);
  // ...
  InitializeSubtitles(subtitle, encoding);
  InitializeDash(mpd_file_path);
}
```

Register external subtitles to the player: check whether an external subtitle URL has been defined, and pass the address to the player:

```cpp
void EsDashPlayerController::InitializeSubtitles(const std::string& subtitle,
                                                 const std::string& encoding) {
  if (subtitle.empty())
    return;

  text_track_ = MakeUnique<TextTrackInfo>();
  int32_t ret = player_->AddExternalSubtitles(subtitle, encoding, *text_track_);
  listeners_.subtitle_listener = make_shared<SubtitleListener>(message_sender_);
  player_->SetSubtitleListener(listeners_.subtitle_listener);
}
```

In the "src/player/es_dash_player/es_dash_player_controller.cc" file, initialize the DASH parser. The application must handle DRM configuration: because DRM-specific data is not part of the DASH standard, it cannot be parsed by the libdash library. The DRM-specific data is stored in the "ContentProtection" DASH manifest section. The `ContentProtectionVisitor` class passes the DRM-specific data to the `DashManifest::ParseMPD` function:

```cpp
void EsDashPlayerController::InitializeDash(const std::string& mpd_file_path) {
  // Native Player only supports PlayReady
  unique_ptr<DrmPlayReadyContentProtectionVisitor> visitor =
      MakeUnique<DrmPlayReadyContentProtectionVisitor>();
  dash_parser_ = DashManifest::ParseMPD(mpd_file_path, visitor.get());
  // ...
  data_source_ = std::make_shared<EsDataSource>();
  // ...
  video_representations_ = dash_parser_->GetVideoStreams();
  audio_representations_ = dash_parser_->GetAudioStreams();
  // ...
}
```

Initialize a video stream from the DASH manifest:

```cpp
void EsDashPlayerController::InitializeVideoStream(
    Samsung::NaClPlayer::DRMType drm_type) {
  if (video_representations_.empty())
    return;
  // ...
  VideoStream s = GetHighestBitrateStream(video_representations_);
  // ...
  if (s.description.content_protection) {
    // DRM content detected
    auto drm_listener = make_shared<DrmPlayReadyListener>(instance_, player_);
    drm_listener->SetContentProtectionDescriptor(
        std::static_pointer_cast<DrmPlayReadyContentProtectionDescriptor>(
            s.description.content_protection));
    player_->SetDRMListener(drm_listener);
  }

  // Create a stream manager to manage downloading and demuxing
  auto& stream_manager = streams_[static_cast<int32_t>(StreamType::Video)];
  stream_manager = make_shared<StreamManager>(instance_, StreamType::Video);
  bool success = stream_manager->Initialize(
      dash_parser_->GetVideoSequence(s.description.id), data_source_,
      std::bind(&EsDashPlayerController::OnStreamConfigured, this, _1),
      drm_type);
  // ...
}
```

Audio streams are initialized in a similar way.

In the "src/player/es_dash_player/stream_manager.cc" file, initialize the `StreamManager` object. It manages elementary streams by downloading the appropriate representation, demuxing it into elementary stream packets, and sending the packets to the player for playback:

```cpp
bool StreamManager::Impl::Initialize(
    unique_ptr<MediaSegmentSequence> segment_sequence,
    shared_ptr<EsDataSource> es_data_source,
    std::function<void(StreamType)> stream_configured_callback,
    DRMType drm_type,
    std::shared_ptr<ElementaryStreamListener> listener) {
  // ...
  // Add a stream to the EsDataSource
  if (stream_type_ == StreamType::Video) {
    auto video_stream = make_shared<VideoElementaryStream>();
    result = es_data_source->AddStream(*video_stream, listener);
    elementary_stream_ =
        std::static_pointer_cast<ElementaryStream>(video_stream);
  } else if (stream_type_ == StreamType::Audio) {
    auto audio_stream = make_shared<AudioElementaryStream>();
    result = es_data_source->AddStream(*audio_stream, listener);
    elementary_stream_ =
        std::static_pointer_cast<ElementaryStream>(audio_stream);
  }
  // ...
  // Initialize the stream demuxer: create and set stream config listeners
  if (!InitParser()) {
    LOG_ERROR("Failed to initialize parser or config listeners");
    return false;
  }
  return ParseInitSegment();
}
```